Section: New Results

Real-time scheduling

Scheduling of tasks in automotive multicore ECUs

Participants : Aurélien Monot, Nicolas Navet, Françoise Simonot-Lion.

As the demand for computing power increases rapidly in the automotive domain, car manufacturers and tier-one suppliers are gradually introducing multicore ECUs into their electronic architectures. These multicore ECUs also offer new features, such as higher levels of parallelism, which ease compliance with safety requirements such as ISO 26262 and the implementation of other automotive use-cases. At the same time, they add complexity to the design, development and verification of software applications, so car manufacturers and suppliers will require new tools and methodologies for deployment and validation. We address the problem of sequencing numerous elementary software components, called runnables, on a limited set of identical cores. We show how this problem can be split into two sub-problems, partitioning the set of runnables and building the sequencing of the runnables on each core, neither of which can be solved optimally in practice due to its algorithmic complexity. We then present low-complexity heuristics to partition the runnables and build the sequencer tasks that execute them on each core, and derive lower bounds on their efficiency (i.e., competitive ratio). Finally, we address the scheduling problem globally, at the ECU level, by discussing how to extend this approach when other OS tasks are scheduled on the same cores as the sequencer tasks. An article summarizing this line of work will appear in IEEE TII [14].
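
To make the partitioning sub-problem concrete, the minimal Python sketch below balances runnable CPU load over identical cores with a worst-fit decreasing heuristic. It only illustrates the flavour of low-complexity partitioning and is not the heuristic analysed in [14]; the runnable names and utilizations are fictitious.

    # Illustrative sketch only: worst-fit decreasing partitioning of runnables
    # onto identical cores. Not the heuristic of [14]; names are hypothetical.

    def partition_runnables(runnables, nb_cores):
        """runnables: list of (name, cpu_utilization) pairs."""
        cores = [{"load": 0.0, "runnables": []} for _ in range(nb_cores)]
        # Place the heaviest runnables first, each on the least loaded core.
        for name, util in sorted(runnables, key=lambda r: r[1], reverse=True):
            target = min(cores, key=lambda c: c["load"])
            target["runnables"].append(name)
            target["load"] += util
        return cores

    if __name__ == "__main__":
        demo = [("r1", 0.12), ("r2", 0.30), ("r3", 0.05), ("r4", 0.22), ("r5", 0.18)]
        for i, core in enumerate(partition_runnables(demo, 2)):
            print(f"core {i}: load={core['load']:.2f} runnables={core['runnables']}")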

Fine-grained hardware modeling in response time analyses

Participants : Dawood Khan, Nicolas Navet.

Early in the design cycle, the two main approaches for verifying timing constraints and dimensioning the networks are worst-case schedulability analysis and simulation. In [29], we advocate that both provide complementary results and that, most often, neither of them alone is sufficient. In particular, we show on automotive case studies that the response time distributions derived from simulations cannot replace worst-case analysis. On the other hand, we show on examples that the analytical models used in worst-case analyses are error-prone and often much-simplified abstractions of the real system, which may lead to optimistic (i.e., unsafe) results.

As an illustration of the latter point, the classical WCRT analysis of the Controller Area Network (CAN) implicitly assumes an infinite number of transmission buffers, which is not the case in practice. High-priority messages may therefore suffer from priority inversion when the buffers are already occupied by lower-priority messages. This results in an additional delay for high-priority messages which, if not accounted for, may lead to deadline violations. In an earlier work, we explained the cause of this additional delay and extended the existing CAN schedulability analysis to integrate it. We have since studied the case where low-priority transmissions cannot be aborted because the communication controller or the driver does not allow it. We show on two case studies that the impact on response times is significant and cannot be neglected in most real-time systems. This work was published in [26].
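
For reference, the sketch below implements the classical CAN worst-case response time recurrence under the idealized assumptions of abort-capable controllers and sufficient transmission buffers, i.e., precisely the assumptions that [26] relaxes. The message set and bit time are fictitious.

    import math

    # Illustrative sketch of the classical CAN WCRT recurrence (ideal buffers,
    # abortable low-priority transmissions). The extended analysis of [26] adds
    # the extra priority-inversion delay that this version ignores.

    def can_wcrt(messages, tau_bit):
        """messages: list of (C, T, D), sorted from highest to lowest priority,
        with C the frame transmission time, T the period and D the deadline."""
        results = []
        for i, (C_i, T_i, D_i) in enumerate(messages):
            # Blocking by the longest lower-priority frame already on the bus.
            B_i = max((C for C, _, _ in messages[i + 1:]), default=0.0)
            w = B_i
            converged = False
            while not converged and w + C_i <= D_i:
                w_next = B_i + sum(math.ceil((w + tau_bit) / T_j) * C_j
                                   for C_j, T_j, _ in messages[:i])
                converged = (w_next == w)
                w = w_next
            results.append(w + C_i)  # a value above D_i flags a deadline miss
        return results

    if __name__ == "__main__":
        # Fictitious 3-message set, times in ms.
        msgs = [(0.5, 10, 10), (0.5, 20, 20), (0.6, 50, 50)]
        print(can_wcrt(msgs, tau_bit=0.004))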

Probabilistically analysable real-time systems

Participants : Liliana Cucu-Grosjean, Codé Lo, Luca Santinelli, Dorin Maxim.

The adoption of more complex hardware to respond to the increasing demand for computing power in next-generation systems exacerbates some of the limitations of static timing analysis for the estimation of the worst-case execution time (WCET). In particular, it increases the effort of acquiring (1) detailed information on the hardware to develop an accurate model of its execution latency, as well as (2) knowledge of the timing behaviour of the program in the presence of varying hardware conditions, such as those depending on the history of previously executed instructions. These problems are known as the timing analysis walls. Probabilistic timing analysis, a novel approach to the analysis of the timing behaviour of next-generation real-time embedded systems, provides answers to these walls. In [11] we showed how probabilistic timing analysis addresses the timing analysis walls. We also presented experimental evidence showing how probabilistic timing analysis reduces the extent of knowledge about the execution platform required to produce probabilistically-safe and tight WCET estimates.
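
As a purely illustrative complement, the sketch below derives a probabilistic WCET estimate by reading an empirical exceedance curve built over measured execution times at a target exceedance probability. This is not the analysis of [11]; the measurement values are fictitious.

    # Illustrative sketch only: empirical exceedance curve over measured
    # execution times and a pWCET estimate read at a target probability.

    def exceedance_curve(samples):
        """Return (value, P[execution time > value]) pairs, sorted by value."""
        ordered = sorted(samples)
        n = len(ordered)
        return [(v, (n - i - 1) / n) for i, v in enumerate(ordered)]

    def pwcet_estimate(samples, target_prob):
        """Smallest observed value whose empirical exceedance probability
        does not exceed target_prob."""
        for value, prob in exceedance_curve(samples):
            if prob <= target_prob:
                return value
        return max(samples)

    if __name__ == "__main__":
        measured = [102, 98, 110, 105, 99, 130, 101, 97, 104, 120]  # cycles, fictitious
        print(pwcet_estimate(measured, 0.1))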

Optimal scheduling policies for real-time systems with probabilistic execution times

Participants : Liliana Cucu-Grosjean, Luca Santinelli, Dorin Maxim, Olivier Buffet, Rob Davis [University of York] .

We have investigated the problem of optimal priority assignment in fixed-priority preemptive single-processor systems where tasks have probabilistic execution times. We identified three sub-problems, each optimising a different metric related to the probability of deadline failures. For each sub-problem we proposed an algorithm that is proved optimal. The first two algorithms are inspired by Audsley's algorithm, a greedy (lowest priority first) approach that is optimal in the case of tasks with deterministic execution times. Since we prove that such a greedy approach is not optimal for the third sub-problem, we proposed a tree-search algorithm for that case. These results were published in [27].
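
For illustration, the following sketch shows the greedy lowest-priority-first scheme underlying Audsley's algorithm, with the schedulability criterion left as a callback. In the probabilistic setting of [27] this callback would evaluate a deadline-miss-probability metric; the interface shown here is only an assumption made for the example.

    # Illustrative sketch of Audsley's lowest-priority-first priority assignment.

    def audsley(tasks, feasible_at_lowest):
        """tasks: list of task identifiers.
        feasible_at_lowest(task, higher): True if `task` meets its constraint
        when given the lowest priority among `higher` + [task]."""
        unassigned = list(tasks)
        priority_order = []  # filled from lowest to highest priority
        while unassigned:
            for task in unassigned:
                others = [t for t in unassigned if t != task]
                if feasible_at_lowest(task, others):
                    priority_order.append(task)
                    unassigned.remove(task)
                    break
            else:
                return None  # no feasible priority ordering exists
        return list(reversed(priority_order))  # highest priority first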

Statistical analysis of real-time systems

Participants : Liliana Cucu-Grosjean, Lu Yue, Thomas Nolte [Mälardalen University] , Ian Bate [University of York] .

The response time analysis of real-time systems usually requires knowledge of WCET estimates, and this knowledge is not always available, e.g., because of intellectual property issues. This problem may be avoided by statistically estimating either the WCET of a task [18] or the response time of each task [37].
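
As an illustrative aside, the sketch below estimates a probabilistic WCET bound from measurements using block maxima and a Gumbel fit by the method of moments. This is a textbook extreme-value recipe, not the statistical protocols of [18] or [37]; the data and block size are fictitious.

    import math, random

    # Illustrative sketch only: block maxima + Gumbel fit (method of moments)
    # to extrapolate a WCET bound at a small exceedance probability.

    EULER_GAMMA = 0.5772156649

    def gumbel_fit(block_maxima):
        n = len(block_maxima)
        mean = sum(block_maxima) / n
        var = sum((x - mean) ** 2 for x in block_maxima) / n
        beta = math.sqrt(6 * var) / math.pi   # scale parameter
        mu = mean - EULER_GAMMA * beta        # location parameter
        return mu, beta

    def pwcet(samples, block_size, exceedance_prob):
        maxima = [max(samples[i:i + block_size])
                  for i in range(0, len(samples), block_size)]
        mu, beta = gumbel_fit(maxima)
        # Gumbel quantile at probability 1 - exceedance_prob
        return mu - beta * math.log(-math.log(1 - exceedance_prob))

    if __name__ == "__main__":
        random.seed(0)
        samples = [100 + random.expovariate(0.2) for _ in range(1000)]  # fictitious
        print(round(pwcet(samples, block_size=50, exceedance_prob=1e-5), 1))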

Multiprocessor scheduling of real-time systems with probabilistic execution times

Participants : Liliana Cucu-Grosjean, Joel Goossens [Université Libre de Bruxelles] .

After providing exact feasibility tests for the case of arbitrary tasks on unrelated processors in [12], we have proposed feasibility tests for tasks with probabilistic execution times [34]. These tests are based on intervals that are proved to contain the instants at which the probability of a deadline miss is highest.
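
As a purely illustrative aside, the sketch below computes a deadline-miss probability by convolving discrete execution-time distributions of interfering jobs. It is not the interval-based feasibility test of [34] and only illustrates the probabilistic setting; the distributions are fictitious.

    from collections import defaultdict

    # Illustrative sketch only: deadline-miss probability from the convolution
    # of discrete execution-time distributions. Not the test of [34].

    def convolve(dist_a, dist_b):
        """Both arguments map an execution time to its probability."""
        result = defaultdict(float)
        for ca, pa in dist_a.items():
            for cb, pb in dist_b.items():
                result[ca + cb] += pa * pb
        return dict(result)

    def deadline_miss_probability(exec_time_dists, deadline):
        total = {0: 1.0}
        for dist in exec_time_dists:
            total = convolve(total, dist)
        return sum(p for c, p in total.items() if c > deadline)

    if __name__ == "__main__":
        job_a = {2: 0.9, 5: 0.1}   # fictitious execution-time distributions
        job_b = {3: 0.8, 6: 0.2}
        print(deadline_miss_probability([job_a, job_b], deadline=8))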

Probabilistic Component-based Approaches

Participants : Luca Santinelli, Patrick Meumeu Yomsi, Dorin Maxim, Liliana Cucu-Grosjean.

We have proposed a probabilistic component-based model which abstracts both the functional and non-functional requirements of real-time systems in its interfaces. This approach allows designers to unify, within the same framework, probabilistic scheduling techniques and compositional guarantees ranging from soft to hard real-time. We have provided sufficient schedulability tests for task systems using this framework when the scheduler is either preemptive fixed-priority or earliest deadline first. These results were published in [35].

Mixed-criticality problems for probabilistic real-time systems

Participants : Bader Alahmad, Luca Santinelli, Liliana Cucu-Grosjean, Sathish Gopalakrishnan [University of British Columbia] .

Critical embedded systems (CESs) face a demand for new functionalities from end users, and these new functionalities impose the use of complex architectures. Complex architectures increase the time variability of programs, which, combined with worst-case reasoning, leads to over-provisioned systems. Avoiding such over-provisioning has become an important problem for CESs. One model answering this problem is the mixed-criticality model. It is then natural to combine mixed-criticality with probabilistic approaches, which are known to decrease over-provisioning by taking into account the fact that worst-case situations have a low probability of occurrence. In [19] we proposed and contrasted two probabilistic execution-behaviour models for systems of independent mixed-criticality jobs executing on a single machine. The models differ in both the system assumptions and the amount of job information they offer and exploit. While one model is compliant with the current standard practice of fixing jobs' criticalities, the other treats job criticalities as random entities, with predetermined probabilities of jobs being of certain criticalities throughout the lifetime of the system.

Energy optimization for real-time systems

Participants : Cristian Maxim, Liliana Cucu-Grosjean, Olivier Zendra.

Many embedded real-time systems integrate battery-operated microprocessor systems with limited battery autonomy. Minimizing energy consumption is thus crucial. In [28] we proposed an algorithm that improves energy consumption in real-time systems by combining Dynamic Voltage Scaling with a reduction in the number of preemptions. Our overall purpose is to focus on a specific part of the problem, namely selectively increasing the frequency of a task to lower its number of preemptions and thereby decrease total energy consumption.
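
The following minimal sketch illustrates the underlying trade-off: raising the normalized frequency increases dynamic energy per cycle (roughly quadratically when voltage scales with frequency) but shortens execution, which can remove preemptions and their energy overhead. All constants and preemption counts are fictitious and do not come from [28].

    # Illustrative sketch only: dynamic energy vs. preemption overhead trade-off.
    # Energy per cycle is modelled as ~ f^2 (assuming voltage scales with f).

    def task_energy(cycles, freq, nb_preemptions, preemption_cost):
        dynamic = cycles * freq ** 2          # dynamic energy of the task itself
        overhead = nb_preemptions * preemption_cost
        return dynamic + overhead

    if __name__ == "__main__":
        cycles = 1_000_000
        # Hypothetical scenario: at f=0.6 the task is preempted 4 times,
        # at f=0.8 its shorter execution window leaves only 1 preemption.
        print(task_energy(cycles, 0.6, nb_preemptions=4, preemption_cost=150_000))
        print(task_energy(cycles, 0.8, nb_preemptions=1, preemption_cost=150_000))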